Even if we could perform operations infinitely fast, there would still
be a limit to our computing power. During execution, algorithms
need working storage to keep track of their ongoing calculations.
This consumes computer memory, which is not infinite.
The measure of the working storage an algorithm needs is
called space complexity. Space complexity analysis is similar to
time complexity analysis. The difference is that we count computer
memory, and not computing operations. We observe how space
complexity evolves when the algorithm’s input size grows, just as
we do for time complexity.
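As a small illustration of this kind of counting (the functions and names here are hypothetical examples, not from the text), compare two routines: one keeps a single accumulator no matter how long the input is, while the other builds a result list that grows with the input.

```python
def running_total(numbers):
    # O(1) working storage: a single accumulator,
    # regardless of how many numbers we process.
    total = 0
    for x in numbers:
        total += x
    return total

def prefix_sums(numbers):
    # O(n) working storage: the result list holds one
    # entry per input element, so memory use grows with n.
    sums = []
    total = 0
    for x in numbers:
        total += x
        sums.append(total)
    return sums
```

Doubling the input doubles the memory `prefix_sums` needs for its result, while `running_total` always uses the same handful of variables.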
For example, Selection Sort (sec. 2.1) just needs working storage
for a fixed set of variables. The number of variables does not
depend on the input size. Therefore, we say Selection Sort’s space
complexity is O(1): no matter what the input size, it requires the
same amount of computer memory for working storage.
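A sketch of Selection Sort in Python makes the fixed working storage visible (this is one common way to write it, not necessarily the exact version from sec. 2.1):

```python
def selection_sort(items):
    """Sort the list in place, in ascending order.

    Besides the input list itself, only a fixed set of
    variables (n, i, j, smallest) is ever used, so the
    working storage is O(1) for any input size.
    """
    n = len(items)
    for i in range(n - 1):
        smallest = i
        # Find the smallest remaining element.
        for j in range(i + 1, n):
            if items[j] < items[smallest]:
                smallest = j
        # Swap it into its final position.
        items[i], items[smallest] = items[smallest], items[i]

data = [5, 2, 9, 1, 7]
selection_sort(data)
print(data)  # → [1, 2, 5, 7, 9]
```

Whether the list holds ten elements or ten million, the algorithm allocates the same few variables.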
However, many other algorithms need working storage that
grows with the input size. Sometimes, it's impossible to meet an
algorithm's memory requirements. Sorting algorithms with
O(n log n) time complexity typically need more than O(1) working
storage. Computer memory limitations sometimes force a trade-off:
with little memory, you may need an algorithm with slow
O(n²) time complexity because it has O(1) space complexity. In
future chapters, we’ll see how clever data handling can improve
space complexity.
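To see how a faster sort buys time with memory, here is a sketch of Merge Sort (a standard textbook version, used here as an assumed example): it runs in O(n log n) time, but its merge step copies elements into auxiliary lists, so its working storage is O(n).

```python
def merge_sort(items):
    # Base case: a list of 0 or 1 elements is already sorted.
    if len(items) <= 1:
        return items
    # The slices and the merged list below are auxiliary
    # storage proportional to n: this is the space cost
    # of achieving O(n log n) time.
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    merged = []
    i = j = 0
    # Merge the two sorted halves.
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```

Compare this with the Selection Sort sketch above: Merge Sort does far fewer comparisons on large inputs, but it pays for that speed with memory proportional to the input size.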
In this chapter, we learned algorithms can have different types of
voracity for consuming computing time and computer memory. We've
seen how to assess it with time and space complexity analysis. We
learned to calculate time complexity by finding the exact T(n)
function, the number of operations performed by an algorithm.
We've seen how to express time complexity using the Big-O
notation (O). Throughout this book, we'll perform simple time
complexity analysis of algorithms using this notation. Many times, cal-